Manifold Learning with Locally Linear Embedding

Several techniques approximate a lower-dimensional manifold embedded in a higher-dimensional space. One example is locally linear embedding (LLE), developed in 2000 by Sam Roweis and Lawrence Saul.

This notebook demonstrates how LLE 'unrolls' the Swiss roll and how it performs on other datasets.

For each data point, LLE identifies a given number of nearest neighbors and computes the weights that best represent the point as a linear combination of those neighbors. It then finds a lower-dimensional embedding that preserves these local weights, linearly mapping each neighborhood onto global internal coordinates on the lower-dimensional manifold; the procedure can be thought of as a sequence of PCA applications.
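The two steps above can be sketched with scikit-learn, whose `LocallyLinearEmbedding` estimator wraps both the neighbor-weight computation and the global embedding. The dataset, neighbor count, and seed below are illustrative choices, not values from this notebook:

```python
import numpy as np
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import LocallyLinearEmbedding

# Sample a 3D Swiss roll; `color` encodes position along the roll
X, color = make_swiss_roll(n_samples=1000, noise=0.1, random_state=42)

# Reconstruct each point from its 10 nearest neighbors,
# then solve for a 2D embedding that preserves those weights
lle = LocallyLinearEmbedding(n_neighbors=10, n_components=2, random_state=42)
X_lle = lle.fit_transform(X)

print(X_lle.shape)                  # (1000, 2)
print(lle.reconstruction_error_)    # how well local weights are preserved
```

Plotting `X_lle` colored by `color` shows whether the roll has been 'unrolled' into a flat sheet.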

Imports & Settings

Manifold Examples

Linear Manifold: Ellipse in 3D

PCA: Linear Dimensionality Reduction

Swiss Roll Example

Linear cuts along the axes

Principal Component Analysis

But will manifold learning simplify the task at hand?

Locally Linear Embedding

S-Curve Example

Locally Linear Embedding

Local Linear Embedding: Standard

The following locally_linear_embedding call on mnist.data takes fairly long to run; hence, we provide pre-computed results so you can explore the visualizations regardless of your hardware setup.
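A minimal caching pattern for this looks as follows: load the embedding from disk if it exists, otherwise compute and save it. The cache filename is hypothetical, and a small random array stands in for `mnist.data` here so the sketch runs quickly; substitute the real (70000, 784) array in practice:

```python
import numpy as np
from pathlib import Path
from sklearn.manifold import locally_linear_embedding

# Placeholder for mnist.data so this sketch runs quickly;
# substitute the real (70000, 784) pixel array
rng = np.random.default_rng(0)
data = rng.normal(size=(200, 784))

cache = Path('mnist_lle_precomputed.npy')  # hypothetical cache file
if cache.exists():
    # Load the pre-computed embedding
    embedding = np.load(cache)
else:
    # Compute the embedding once and cache it for later sessions
    embedding, err = locally_linear_embedding(
        data, n_neighbors=20, n_components=2, random_state=0)
    np.save(cache, embedding)

print(embedding.shape)  # (200, 2)
```

On the full dataset this computation dominates the notebook's runtime, which is why the pre-computed results are shipped alongside it.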

Load Fashion MNIST Data

Locally Linear Embedding